Latent Subspace Clustering based on Deep Neural Networks
Authors
Abstract
Clustering approaches have been widely used in the process control community for unsupervised classification, which benefits further analysis, modeling, and optimization. Process data generally involve far more dimensions than needed; this phenomenon, often described as "data rich but information poor," becomes an obstacle to reasonable classification. It is therefore desirable to use latent variable models such as principal component analysis (PCA) to lower the dimension of the data. Traditional clustering models, however, are established directly on the data and make no allowance for a latent subspace, which can cause inaccuracy in unsupervised data classification. In recent years, deep neural networks (DNNs) have proved effective for developing latent variable models, a practice commonly termed "deep learning." In this paper, we propose a novel clustering approach that combines a DNN with the traditional K-means method: the DNN is responsible for describing the latent subspace within the data, and K-means is used for clustering in the derived latent subspace. The proposed method has better generalization performance due to its strong nonlinear representation ability, and it is especially favored in the case of high-dimensional data with significant correlations. The efficacy of the proposed method is demonstrated on two benchmark data sets in comparison with traditional clustering approaches.
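As a rough illustration of the two-stage idea described in the abstract (not the authors' implementation), the sketch below trains a small auto-encoder to learn a latent subspace and then applies K-means to the latent codes. The toy data, network sizes, and the choice of PyTorch and scikit-learn are assumptions made for the example.

```python
# Hypothetical sketch: auto-encoder latent subspace + K-means clustering.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

torch.manual_seed(0)
np.random.seed(0)

# Toy high-dimensional, highly correlated data: 300 samples in 50 dimensions
# generated from a 5-dimensional latent source.
X = np.random.randn(300, 5) @ np.random.randn(5, 50)
X_t = torch.tensor(X, dtype=torch.float32)

latent_dim = 5
encoder = nn.Sequential(nn.Linear(50, 20), nn.ReLU(), nn.Linear(20, latent_dim))
decoder = nn.Sequential(nn.Linear(latent_dim, 20), nn.ReLU(), nn.Linear(20, 50))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

# Stage 1: fit the auto-encoder on reconstruction error to obtain the latent subspace.
for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(decoder(encoder(X_t)), X_t)
    loss.backward()
    opt.step()

# Stage 2: run K-means in the derived latent subspace.
with torch.no_grad():
    Z = encoder(X_t).numpy()
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
print(labels[:10])
```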
Similar resources
Deep Subspace Clustering Networks
We present a novel deep neural network architecture for unsupervised subspace clustering. This architecture is built upon deep auto-encoders, which non-linearly map the input data into a latent space. Our key idea is to introduce a novel self-expressive layer between the encoder and the decoder to mimic the "self-expressiveness" property that has proven effective in traditional subspace clusteri...
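For readers unfamiliar with the construction, a self-expressive layer can be sketched as a bias-free linear map over the batch of latent codes with its diagonal zeroed out, trained so that each code is reconstructed from the other codes. The class name, batch size, and loss weights below are illustrative assumptions, not the cited architecture.

```python
# Hypothetical sketch of a self-expressive layer: learn C (zero diagonal)
# such that Z is approximated by C @ Z, with a sparsity penalty on C.
import torch
import torch.nn as nn

class SelfExpressive(nn.Module):
    def __init__(self, n_samples):
        super().__init__()
        # One coefficient per pair of samples in the (full-batch) data set.
        self.C = nn.Parameter(1e-4 * torch.randn(n_samples, n_samples))

    def forward(self, z):
        c = self.C - torch.diag(torch.diag(self.C))  # forbid trivial self-reconstruction
        return c @ z, c

# Usage with stand-in latent codes (in practice, the encoder's output):
z = torch.randn(32, 10)
layer = SelfExpressive(n_samples=32)
z_hat, c = layer(z)
loss = ((z_hat - z) ** 2).sum() + 1e-2 * c.abs().sum()
loss.backward()
```

The learned coefficients are then typically symmetrized into an affinity matrix and passed to spectral clustering.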
Deep Sparse Subspace Clustering
In this paper, we present a deep extension of Sparse Subspace Clustering, termed Deep Sparse Subspace Clustering (DSSC). Regularized by the unit sphere distribution assumption for the learned deep features, DSSC can infer a new data affinity matrix by simultaneously satisfying the sparsity principle of SSC and the nonlinearity given by neural networks. One of the appealing advantages brought by...
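A simplified stand-in for this kind of pipeline, assuming precomputed features in place of a trained network: project the features onto the unit sphere, solve a sparse self-representation problem per sample, and spectrally cluster the resulting affinity matrix. The Lasso penalty weight and the use of scikit-learn are assumptions of the sketch, not details from the paper.

```python
# Illustrative pipeline: unit-norm features -> sparse self-representation
# -> affinity matrix -> spectral clustering.
import numpy as np
from sklearn.preprocessing import normalize
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
F = rng.standard_normal((60, 30))       # stand-in for learned deep features
F = normalize(F)                        # unit-sphere assumption on the features

N = F.shape[0]
C = np.zeros((N, N))
for i in range(N):
    others = np.delete(np.arange(N), i)
    # Express sample i as a sparse combination of all other samples.
    coef = Lasso(alpha=0.05, max_iter=5000).fit(F[others].T, F[i]).coef_
    C[i, others] = coef

A = np.abs(C) + np.abs(C).T             # symmetric, nonnegative affinity
labels = SpectralClustering(n_clusters=3, affinity="precomputed",
                            random_state=0).fit_predict(A)
print(labels)
```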
Introducing Deep Modular Neural Networks with a Dual Spatio-Temporal Structure for Improving Persian Continuous Speech Recognition
In this article, growable deep modular neural networks for continuous speech recognition are introduced. These networks can grow to capture the spatio-temporal information of the frame sequences at their input layer and, at the same time, of their labels at the output layer. A neural network trained with such a dual spatio-temporal association structure can learn the phonetic sequence...
Deep Subspace Clustering with Sparsity Prior
Subspace clustering aims to cluster unlabeled samples into multiple groups by implicitly seeking a subspace to fit each group. Most existing methods are based on a shallow linear model, which may fail to handle data with nonlinear structure. In this paper, we propose a novel subspace clustering method – deeP subspAce clusteRing with sparsiTY prior (PARTY) – based on a new deep learning arc...
Exploring Linear Relationship in Feature Map Subspace for ConvNets Compression
While the research on convolutional neural networks (CNNs) is progressing quickly, the real-world deployment of these models is often limited by computing resources and memory constraints. In this paper, we address this issue by proposing a novel filter pruning method to compress and accelerate CNNs. Our work is based on the linear relationship identified in different feature map subspaces via ...
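One way to picture the general idea, under the assumption that redundancy shows up as linear dependence among feature maps (a hypothetical sketch, not the paper's actual pruning criterion), is to flag output channels whose maps are nearly linear combinations of the remaining channels:

```python
# Hypothetical redundancy check: a channel whose feature map is (almost) a
# linear combination of the other channels is a pruning candidate.
import numpy as np

rng = np.random.default_rng(0)
maps = rng.standard_normal((8, 256))      # (channels, flattened spatial locations)
maps[7] = 0.5 * maps[0] - 1.2 * maps[3]   # inject an exactly dependent channel

def relative_reconstruction_error(maps, ch):
    """Least-squares error of rebuilding one channel from all the others."""
    others = np.delete(maps, ch, axis=0)
    coef, *_ = np.linalg.lstsq(others.T, maps[ch], rcond=None)
    residual = maps[ch] - others.T @ coef
    return np.linalg.norm(residual) / np.linalg.norm(maps[ch])

errors = [relative_reconstruction_error(maps, c) for c in range(maps.shape[0])]
# Channels 0, 3, and 7 form a dependent set, so all three show near-zero error;
# a greedy procedure would remove one at a time and re-evaluate.
print([c for c, e in enumerate(errors) if e < 1e-6])
```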
Publication date: 2014